
    Essays in Applied Bayesian Analysis

    With the continuing rapid growth of computational power, Bayesian statistical methods have become increasingly popular in a wide variety of application fields because of their user-friendliness and estimation capabilities. This thesis explores applied Bayesian methodological topics and empirical examples, focusing on nonhomogeneous hidden Markov models (NHMMs) and measurement error models, in three chapters. The first chapter proposes a subsequence-based variational Bayesian inference framework for NHMMs to address the computational problems encountered when analyzing datasets containing long sequences. The second chapter concentrates on measurement error models, proposing a Bayesian estimation procedure for the partial potential impact fraction (pPIF) in the presence of measurement error. The third chapter focuses on an empirical application in marketing, introducing a coupled nonhomogeneous hidden Markov model (CNHMM) as a novel framework for customer relationship management.
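
    The nonhomogeneous structure can be made concrete with a small sketch: the transition matrix of a hidden Markov chain is allowed to depend on time-varying covariates through a multinomial-logit (softmax) link. The link function, dimensions, and variable names below are illustrative assumptions; the thesis's exact specification and its subsequence-based variational inference scheme are not reproduced.

```python
# Minimal sketch of a nonhomogeneous HMM transition matrix (assumed softmax
# link, not the thesis's specification): covariates x_t shift the
# log-probabilities of moving between hidden states.
import numpy as np

def nhmm_transition_matrix(x_t, W, b):
    """Return P_t with entries P(z_t = j | z_{t-1} = i, x_t).

    x_t : (d,) covariate vector at time t
    W   : (K, K, d) regression weights, one coefficient vector per (i, j) pair
    b   : (K, K) intercepts
    """
    scores = W @ x_t + b                           # (K, K) unnormalised log-probabilities
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    P = np.exp(scores)
    return P / P.sum(axis=1, keepdims=True)        # each row sums to 1

# Example: 3 hidden states, 2 covariates
rng = np.random.default_rng(0)
P_t = nhmm_transition_matrix(rng.normal(size=2),
                             rng.normal(size=(3, 3, 2)),
                             np.zeros((3, 3)))
print(P_t.sum(axis=1))   # [1. 1. 1.]
```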

    A slave mode expansion for obtaining ab-initio interatomic potentials

    Here we propose a new approach for performing a Taylor series expansion of the first-principles computed energy of a crystal as a function of the nuclear displacements. We enlarge the dimensionality of the existing displacement space and form new variables (i.e., slave modes) which transform like irreducible representations of the space group and satisfy homogeneity of free space. Standard group-theoretical techniques can then be applied to deduce the non-zero expansion coefficients a priori. At a given order, the translation group can be used to contract the products and eliminate terms which are not linearly independent, resulting in a final set of slave mode products. While the expansion coefficients can be computed in a variety of ways, we demonstrate that finite difference is effective up to fourth order. We demonstrate the power of the method in the strongly anharmonic system PbTe, computing all anharmonic terms within an octahedron up to fourth order. A proper unitary transformation demonstrates that the vast majority of the anharmonicity can be attributed to just two terms, indicating that a minimal model of phonon interactions is achievable. The ability to straightforwardly generate polynomial potentials will allow precise simulations at length and time scales that were previously unrealizable.
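
    To make the finite-difference step concrete, the sketch below estimates second- and third-order expansion coefficients of an energy surface by central differences. The toy `energy` function stands in for a first-principles calculation, and the scheme shown is generic central differencing, not the authors' slave-mode construction.

```python
# Minimal sketch (not the authors' implementation): estimating low-order
# Taylor expansion coefficients of an energy surface E(u) in the nuclear
# displacements u by central finite differences.
import numpy as np

def energy(u):
    # toy anharmonic potential standing in for a DFT energy evaluation
    return 0.5 * u @ u + 0.1 * u[0] ** 2 * u[1] + 0.01 * u[0] ** 4

def second_derivative(E, u0, i, j, h=1e-3):
    """Central-difference estimate of d^2 E / du_i du_j at u0."""
    def shift(di, dj):
        u = u0.copy()
        u[i] += di * h
        u[j] += dj * h
        return E(u)
    return (shift(+1, +1) - shift(+1, -1) - shift(-1, +1) + shift(-1, -1)) / (4 * h ** 2)

def third_derivative(E, u0, i, j, k, h=1e-2):
    """d^3 E / du_i du_j du_k as a central difference of second derivatives."""
    def at(dk):
        u = u0.copy()
        u[k] += dk * h
        return second_derivative(E, u, i, j, h)
    return (at(+1) - at(-1)) / (2 * h)

u0 = np.zeros(3)
print(second_derivative(energy, u0, 0, 0))    # ~1.0 (harmonic term)
print(third_derivative(energy, u0, 0, 0, 1))  # ~0.2 (cubic anharmonic term)
```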

    A Near-Linear Time Sampler for the Ising Model

    We give a near-linear time sampler for the Gibbs distribution of the ferromagnetic Ising model with edge activities $\beta > 1$ and external fields $\lambda < 1$ (or, symmetrically, $\lambda > 1$) on general graphs with bounded or unbounded maximum degree. Our algorithm is based on the field dynamics given in [CLV21]. We prove the correctness and efficiency of our algorithm by establishing spectral independence of the distribution of the random cluster model and the rapid mixing of Glauber dynamics on the random cluster model in a low-temperature regime, which may be of independent interest.
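
    As a baseline illustration of the distribution being sampled (not the near-linear field-dynamics algorithm of the paper), the sketch below runs plain single-site Glauber dynamics on the ferromagnetic Ising Gibbs measure with edge activity beta and external field lambda; the graph and parameter values are made up.

```python
# Minimal sketch: single-site Glauber dynamics targeting the ferromagnetic
# Ising Gibbs distribution
#   pi(sigma) proportional to prod_{(u,v) in E} beta^{[sigma_u = sigma_v]}
#                            * prod_v lambda^{[sigma_v = +1]}.
# This is NOT the near-linear sampler of the abstract, only the baseline
# chain on the distribution that sampler targets.
import random

def glauber_ising(adj, beta, lam, steps, seed=0):
    rng = random.Random(seed)
    n = len(adj)
    sigma = [rng.choice([-1, +1]) for _ in range(n)]
    for _ in range(steps):
        v = rng.randrange(n)
        # conditional weights of sigma_v = +1 / -1 given the neighbours
        w_plus, w_minus = lam, 1.0
        for u in adj[v]:
            w_plus *= beta if sigma[u] == +1 else 1.0
            w_minus *= beta if sigma[u] == -1 else 1.0
        sigma[v] = +1 if rng.random() < w_plus / (w_plus + w_minus) else -1
    return sigma

# Example: 4-cycle, beta > 1 (ferromagnetic), lambda < 1 (external field)
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(glauber_ising(adj, beta=1.5, lam=0.8, steps=10_000))
```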

    Distributed Contingency Analysis over Wide Area Network among Dispatch Centers

    Traditionally, a regional dispatch center uses an equivalent method to represent external grids, which fails to reflect the interactions among regions. This paper proposes a distributed N-1 contingency analysis (DCA) solution in which dispatch centers join a coordinated computation using their private data and computing resources. A distributed screening method is presented to determine the Critical Contingency Set (DCCS) in DCA. The distributed power flow is then formulated as a set of boundary equations, which is solved by a Jacobian-Free Newton-GMRES (JFNG) method; only boundary conditions are exchanged during the solution. Acceleration techniques are also introduced, including reusing preconditioners and optimal resource scheduling during parallel processing of multiple contingencies. The proposed method is implemented on a real EMS platform, where tests on the Southwest Regional Grid of China validate its feasibility.
    Comment: 5 pages, 6 figures, 2017 IEEE PES General Meeting
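
    The Jacobian-Free Newton-GMRES idea can be sketched in a few lines: Newton iterations in which the Jacobian is never assembled, Jacobian-vector products are approximated by finite differences, and GMRES solves each linear step. The residual function `F` below is a toy stand-in, not the paper's power-flow boundary equations.

```python
# Minimal sketch of a Jacobian-Free Newton-GMRES (JFNG) solve for F(x) = 0.
# The Jacobian is never formed: only Jacobian-vector products, approximated
# by finite differences, are supplied to GMRES.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def jfng(F, x0, newton_tol=1e-8, max_newton=50, fd_eps=1e-7):
    x = x0.astype(float).copy()
    for _ in range(max_newton):
        Fx = F(x)
        if np.linalg.norm(Fx) < newton_tol:
            break
        def Jv(v):
            # matrix-free product: J(x) v ~= (F(x + eps v) - F(x)) / eps
            v = np.ravel(v)
            return (F(x + fd_eps * v) - Fx) / fd_eps
        J = LinearOperator((x.size, x.size), matvec=Jv)
        dx, _ = gmres(J, -Fx)          # solve J dx = -F(x) without forming J
        x += dx
    return x

# Toy example: a small nonlinear system standing in for boundary mismatches
def F(x):
    return np.array([x[0] ** 2 + x[1] - 2.0,
                     x[0] + x[1] ** 2 - 2.0])

print(jfng(F, np.array([1.5, 0.5])))   # converges to approximately [1, 1]
```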

    Competing risks regression for clustered survival data via the marginal additive subdistribution hazards model

    A population-averaged additive subdistribution hazards model is proposed to assess the marginal effects of covariates on the cumulative incidence function and to analyze correlated failure time data subject to competing risks. This approach extends the population-averaged additive hazards model by accommodating potentially dependent censoring due to competing events other than the event of interest. Assuming an independent working correlation structure, an estimating equations approach is outlined to estimate the regression coefficients, and a new sandwich variance estimator is proposed. The proposed sandwich variance estimator accounts for the correlations both among the failure times and among the censoring times, and is robust to misspecification of the unknown dependency structure within each cluster. We further develop goodness-of-fit tests to assess the adequacy of the additive structure of the subdistribution hazards, both for the overall model and for each covariate. Simulation studies are conducted to investigate the performance of the proposed methods in finite samples. We illustrate our methods using data from the STrategies to Reduce Injuries and Develop confidence in Elders (STRIDE) trial.
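
    The sandwich construction behind the variance estimator can be illustrated with a much simpler estimating equation. The sketch below computes a cluster-robust sandwich covariance for ordinary least squares; the bread/meat structure is the generic one, but the model, data, and names are assumptions, and it is not the paper's estimator for the additive subdistribution hazards model.

```python
# Minimal sketch of the sandwich idea: for an estimating equation
# sum_i U_i(beta) = 0 under an independent working correlation,
# Var(beta_hat) ~= A^{-1} B A^{-1}, where A is the derivative ("bread") and
# B sums outer products of cluster-level scores ("meat"). Shown for OLS with
# clustered data, NOT for the paper's subdistribution hazards model.
import numpy as np

def cluster_sandwich_ols(X, y, cluster):
    beta = np.linalg.solve(X.T @ X, X.T @ y)          # point estimate
    resid = y - X @ beta
    A = X.T @ X                                        # "bread"
    B = np.zeros((X.shape[1], X.shape[1]))             # "meat"
    for c in np.unique(cluster):
        idx = cluster == c
        score_c = X[idx].T @ resid[idx]                # cluster-level score
        B += np.outer(score_c, score_c)
    cov = np.linalg.solve(A, np.linalg.solve(A, B).T)  # A^{-1} B A^{-1}
    return beta, cov

rng = np.random.default_rng(1)
n, p = 200, 3
X = rng.normal(size=(n, p))
cluster = np.repeat(np.arange(40), 5)                  # 40 clusters of size 5
y = X @ np.array([1.0, -0.5, 0.25]) + rng.normal(size=n) + rng.normal(size=40)[cluster]
beta, cov = cluster_sandwich_ols(X, y, cluster)
print(beta, np.sqrt(np.diag(cov)))                     # estimates and robust SEs
```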

    Model on empirically calibrating stochastic traffic flow fundamental diagram

    This paper addresses two shortcomings of the data-driven stochastic fundamental diagram for freeway traffic. The first concerns the least-squares methods that have been widely used to establish traffic flow fundamental diagrams. We argue that these methods are not suitable for generating percentile-based stochastic fundamental diagrams, because least-squares estimates represent a weighted sample mean rather than a percentile. The second shortcoming is the widespread practice of modeling each member of a family of percentile-based fundamental diagrams independently. Existing methods cannot coordinate the fundamental diagrams within the same family and consequently do not comply with basic rules of probability theory and statistics. To address these issues, this paper proposes a holistic modeling framework based on mean absolute error minimization. The resulting model is convex but non-differentiable. To implement the proposed methodology efficiently, we further reformulate the model as a linear programming problem that can be solved by state-of-the-art solvers. Experimental results using real-world traffic flow data validate the proposed method.
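
    The key reformulation, recasting an absolute-error fit as a linear program, can be sketched as follows: auxiliary variables bound the absolute residuals from above and their sum is minimized. The linear-in-parameters speed-density form used here is a stand-in; the paper's fundamental-diagram specification and its percentile coordination are not reproduced.

```python
# Minimal sketch of a least-absolute-error fit recast as a linear program:
# introduce e_i >= |y_i - x_i^T theta| and minimize sum_i e_i.
# Illustrative only; not the paper's fundamental-diagram model.
import numpy as np
from scipy.optimize import linprog

def lad_fit(X, y):
    n, p = X.shape
    # decision vector: [theta (p entries), e (n entries)]
    c = np.concatenate([np.zeros(p), np.ones(n)])       # minimise sum of e_i
    # constraints:  X theta - e <= y   and   -X theta - e <= -y
    A_ub = np.block([[X, -np.eye(n)], [-X, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * p + [(0, None)] * n        # theta free, e >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]

# Toy speed-density data with a few outliers; fit speed ~ theta0 + theta1 * density
rng = np.random.default_rng(2)
density = rng.uniform(5, 80, size=100)
speed = 100 - 1.1 * density + rng.normal(0, 3, size=100)
speed[:5] += 40                                          # outliers
X = np.column_stack([np.ones_like(density), density])
print(lad_fit(X, speed))                                 # roughly [100, -1.1]
```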